# Bidirectional Attention Mechanism
## LLM2Vec Sheared LLaMA MNTP

LLM2Vec is a simple recipe for transforming decoder-only large language models into text encoders, achieved in three steps: enabling bidirectional attention, masked next-token prediction (MNTP), and unsupervised contrastive learning.

- Organization: McGill-NLP
- License: MIT
- Task: Text Embedding
- Library/Language: Transformers, English
- Downloads: 2,430
- Likes: 5
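The first step of that recipe, enabling bidirectional attention, amounts to dropping the causal mask so every token can attend to every other token. A minimal NumPy sketch of the difference (an illustration under simplified single-head assumptions, not the LLM2Vec implementation):

```python
import numpy as np

def attention(q, k, v, causal=True):
    """Single-head scaled dot-product attention, optionally causal."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    if causal:
        # Decoder-style mask: position i may only attend to j <= i.
        mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
        scores = np.where(mask, -np.inf, scores)
    # Numerically stable softmax over the key dimension.
    scores = scores - scores.max(axis=-1, keepdims=True)
    w = np.exp(scores)
    w = w / w.sum(axis=-1, keepdims=True)
    return w @ v, w

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))           # 4 tokens, 8-dim embeddings
_, w_causal = attention(x, x, x, causal=True)
_, w_bidir = attention(x, x, x, causal=False)

# Causal weights are lower-triangular: no token sees the future.
print(np.allclose(np.triu(w_causal, k=1), 0.0))  # True
# Bidirectional weights are dense: every token attends everywhere.
print((w_bidir > 0).all())                       # True
```

In a real decoder-only model this mask toggle alone degrades representations, which is why LLM2Vec follows it with MNTP training and contrastive learning to adapt the weights to the new attention pattern.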
## Chinese XLNet Base

An XLNet model pre-trained for Chinese, released to enrich Chinese natural language processing resources and broaden the choice of Chinese pre-trained models.

- Organization: hfl
- License: Apache-2.0
- Task: Large Language Model
- Library/Language: Transformers, Chinese
- Downloads: 1,149
- Likes: 31